
(h, Φ)-entropy differential metric

María Luisa Menéndez, Domingo Morales, Leandro Pardo, Miquel Salicrú (1997)

Applications of Mathematics

Burbea and Rao (1982a, 1982b) gave some general methods for constructing quadratic differential metrics on probability spaces. Using these methods, they obtained the Fisher information metric as a particular case. In this paper we apply the method based on entropy measures to obtain a Riemannian metric based on (h, Φ)-entropy measures (Salicrú et al., 1993). The geodesic distances based on that information metric have been computed for a number of parametric families of distributions. The use of geodesic...
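The abstract above refers to the (h, Φ)-entropy family of Salicrú et al. (1993), which takes the form H(p) = h(Σᵢ Φ(pᵢ)). As a minimal illustrative sketch (not code from the paper; the function names are mine), here is how Shannon and Rényi entropies arise as members of that family:

```python
import math

def h_phi_entropy(p, h, phi):
    """(h, Φ)-entropy: H(p) = h(sum_i Φ(p_i)), following Salicrú et al. (1993)."""
    return h(sum(phi(pi) for pi in p))

# Shannon entropy: h = identity, Φ(x) = -x log x
def shannon(p):
    return h_phi_entropy(p, lambda s: s,
                         lambda x: -x * math.log(x) if x > 0 else 0.0)

# Rényi entropy of order α: h(s) = log(s) / (1 - α), Φ(x) = x**α
def renyi(p, alpha):
    return h_phi_entropy(p, lambda s: math.log(s) / (1 - alpha),
                         lambda x: x ** alpha)

uniform = [0.25] * 4
print(shannon(uniform))     # log 4 ≈ 1.3863
print(renyi(uniform, 2.0))  # also log 4 for the uniform distribution
```

For the uniform law both reduce to log n, as expected; the Riemannian metric construction in the paper differentiates such entropies along a parametric family.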

(R, S)-information radius of type t and comparison of experiments

Inder Jeet Taneja, Luis Pardo, D. Morales (1991)

Applications of Mathematics

Various information, divergence and distance measures have been used by researchers to compare experiments using classical approaches such as those of Blackwell and the Bayesians. Blackwell's [1] idea of comparing two statistical experiments is based on the existence of stochastic transformations. Using this idea of Blackwell, as well as the classical Bayesian approach, we have compared statistical experiments by considering unified scalar parametric generalizations of the Jensen difference divergence measure....
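The Jensen difference divergence mentioned above is, in its basic Shannon-entropy case, the concavity gap H(wP + (1-w)Q) - wH(P) - (1-w)H(Q) (the Jensen–Shannon divergence when w = 1/2). A minimal sketch of that base case, not of the paper's unified parametric generalizations:

```python
import math

def shannon_entropy(p):
    """Shannon entropy in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def jensen_difference(p, q, w=0.5):
    """Jensen difference of Shannon entropy:
    H(w*p + (1-w)*q) - w*H(p) - (1-w)*H(q); >= 0 by concavity of H."""
    m = [w * pi + (1 - w) * qi for pi, qi in zip(p, q)]
    return shannon_entropy(m) - w * shannon_entropy(p) - (1 - w) * shannon_entropy(q)

p = [0.5, 0.5]
q = [0.9, 0.1]
print(jensen_difference(p, q))  # positive, and zero iff p == q
```

With w = 1/2 the measure is symmetric in its arguments, which is one reason Jensen-type divergences are convenient for comparing experiments.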

φ-divergences, sufficiency, Bayes sufficiency, and deficiency

Friedrich Liese (2012)

Kybernetika

The paper studies the relations between φ-divergences and fundamental concepts of decision theory such as sufficiency, Bayes sufficiency, and LeCam's deficiency. A new and considerably simplified approach is given to the spectral representation of φ-divergences already established in Österreicher and Feldman [28] under restrictive conditions and in Liese and Vajda [22], [23] in the general form. The simplification is achieved by a new integral representation of convex functions in terms of elementary...
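For discrete distributions, a φ-divergence takes the standard form D_φ(P, Q) = Σᵢ qᵢ φ(pᵢ/qᵢ) for a convex φ with φ(1) = 0. A minimal sketch (function names are mine, not the paper's) showing two classical instances:

```python
import math

def phi_divergence(p, q, phi):
    """D_φ(P,Q) = sum_i q_i * φ(p_i / q_i), for strictly positive q
    and convex φ with φ(1) = 0."""
    return sum(qi * phi(pi / qi) for pi, qi in zip(p, q))

# φ(t) = t log t recovers the Kullback–Leibler divergence
def kl(p, q):
    return phi_divergence(p, q, lambda t: t * math.log(t) if t > 0 else 0.0)

# φ(t) = (t - 1)**2 recovers the Pearson χ² divergence
def chi2(p, q):
    return phi_divergence(p, q, lambda t: (t - 1) ** 2)

p = [0.5, 0.5]
q = [0.25, 0.75]
print(kl(p, q))    # > 0 since p != q
print(chi2(p, q))  # > 0 since p != q
```

Both vanish exactly when P = Q, the basic property that connects φ-divergences to sufficiency comparisons in the paper.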
